DistilBERT Base Uncased MNLI
DistilBERT, a lightweight distilled version of BERT with fewer parameters and faster inference, fine-tuned on the Multi-Genre Natural Language Inference (MNLI) dataset for natural language inference and zero-shot classification tasks.
Large Language Model
Transformers
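The sketch below shows one common way to use an MNLI-finetuned DistilBERT checkpoint: as a zero-shot classifier, where entailment between the input text and a hypothesis built from each candidate label stands in for label relevance. The Hub identifier used here is an assumption; substitute the actual model ID for this card.

```python
# Minimal sketch: zero-shot classification with an MNLI-finetuned DistilBERT
# via the Hugging Face transformers pipeline.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="typeform/distilbert-base-uncased-mnli",  # assumed Hub identifier
)

result = classifier(
    "The new GPU driver cut inference latency in half.",
    candidate_labels=["technology", "sports", "politics"],
)

# The pipeline returns labels sorted by score; print the top prediction.
print(result["labels"][0], result["scores"][0])
```

Because the checkpoint is uncased, inputs are lowercased by the tokenizer, and no casing-sensitive preprocessing is needed on the caller's side.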